# Text summarization
| Model | Description | Tags | License | Org | Downloads | Likes |
|---|---|---|---|---|---|---|
| Pino BigBird RoBERTa Base | Dutch pre-trained model based on the BigBird architecture; uses block-sparse attention to process sequences of up to 4,096 tokens. | Large Language Model, Other | — | flax-community | 17 | 2 |
| Swe GPT Wiki | Swedish GPT-2-style model trained with the Flax CLM pipeline on the Swedish portion of the wiki40b dataset. | Large Language Model, Other | — | flax-community | 24 | 3 |
| T5 Base | Text-to-text Transformer model from Google with 220 million parameters, supporting multilingual NLP tasks. | Large Language Model, Multilingual | Apache-2.0 | google-t5 | 5.4M | 702 |
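The Pino entry above rests on BigBird's block-sparse attention: each query block attends only to a few global blocks, a sliding window of neighbors, and some random blocks, so cost grows roughly linearly with sequence length instead of quadratically. Below is a minimal sketch of that pattern; the function name, block counts, and parameter values are illustrative, not the model's actual configuration.

```python
import random

def bigbird_block_pattern(num_blocks, num_global=2, window=1, num_random=2, seed=0):
    """Return, for each query block, the set of key blocks it attends to
    under a BigBird-style block-sparse pattern (illustrative sketch)."""
    rng = random.Random(seed)
    pattern = []
    for q in range(num_blocks):
        keys = set(range(num_global))                       # global blocks
        keys.update(b for b in range(q - window, q + window + 1)
                    if 0 <= b < num_blocks)                 # sliding window
        keys.update(rng.sample(range(num_blocks), num_random))  # random blocks
        pattern.append(keys)
    return pattern

# 4,096 tokens with a block size of 64 gives 64 blocks; each query block
# attends to only a handful of key blocks rather than all 64.
pat = bigbird_block_pattern(64)
avg = sum(len(keys) for keys in pat) / len(pat)
print(f"average key blocks per query block: {avg:.1f} of 64")
```

Full attention over 64 blocks would touch 64 key blocks per query block; the sparse pattern touches only a few, which is what makes 4,096-token inputs tractable.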
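T5 frames every task, including summarization, as text-to-text: the input is prefixed with a task string such as `summarize: ` and the model generates the output text. A minimal sketch of preparing such an input is below; `build_t5_input` is a hypothetical helper, not part of any library, and the character-based truncation is a crude stand-in for proper token-level truncation. The commented lines show how the model could then be invoked through the Hugging Face `transformers` pipeline (they download the weights, so they are left disabled here).

```python
def build_t5_input(text: str, max_chars: int = 2000) -> str:
    """Prepend T5's summarization task prefix (text-to-text convention).
    Hypothetical helper; truncates by characters as a rough length guard."""
    return "summarize: " + text[:max_chars]

if __name__ == "__main__":
    # Requires `pip install transformers sentencepiece` and downloads the weights:
    # from transformers import pipeline
    # summarizer = pipeline("summarization", model="google-t5/t5-base")
    # print(summarizer(build_t5_input(article_text))[0]["summary_text"])
    print(build_t5_input("Long article text ..."))
```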